11 research outputs found

    Optical Fiber-Based Needle Shape Sensing in Real Tissue: Single Core vs. Multicore Approaches

    Flexible needle insertion procedures are common in minimally invasive surgeries for diagnosing and treating prostate cancer. Bevel-tip needles give physicians the capability to steer the needle during long insertions to avoid vital anatomical structures in the patient and reduce post-operative patient discomfort. To provide needle placement feedback to the physician, sensors are embedded into needles to determine the real-time 3D shape of the needle during operation without needing to visualize the needle intra-operatively. Through extensive research in fiber optics, a plethora of bio-compatible, MRI-compatible optical shape sensors have been developed to provide real-time shape feedback, such as single-core and multicore fiber Bragg gratings. In this paper, we directly compare single-core fiber-based and multicore fiber-based needle shape-sensing through identically constructed, four-active-area sensorized bevel-tip needles inserted into phantom and ex vivo tissue on the same experimental platform. We found that for shape-sensing in phantom tissue, the two needles performed identically, with a p-value of 0.164 > 0.05, but in ex vivo real tissue, the single-core fiber sensorized needle significantly outperformed the multicore fiber configuration, with a p-value of 0.0005 < 0.05. This paper also presents the experimental platform and method for directly comparing these optical shape sensors for the needle shape-sensing task, and provides direction, insight, and required considerations for future work in constructively optimizing sensorized needles.
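    The multicore-fiber approach summarized above recovers bend information from differential strain across off-axis cores. As a minimal illustrative sketch (all calibration constants are assumed for illustration, not taken from the paper), the curvature magnitude and bend direction at one active area can be estimated from the Bragg wavelength shifts of three outer cores spaced 120° apart, via a least-squares fit of the bending-strain model:

```python
import numpy as np

# Hypothetical calibration constants (illustrative, not from the paper)
GAUGE_FACTOR = 0.78    # strain-to-wavelength sensitivity term for silica fiber
CORE_RADIUS = 35e-6    # m, radial offset of outer cores from the fiber axis
LAMBDA_0 = 1550e-9     # m, nominal Bragg wavelength

def curvature_from_shifts(dlam):
    """Estimate curvature magnitude and bend direction at one active area
    from the Bragg wavelength shifts (in meters) of three outer cores
    placed at 0, 120, and 240 degrees around the fiber axis."""
    angles = np.deg2rad([0.0, 120.0, 240.0])
    strain = dlam / (GAUGE_FACTOR * LAMBDA_0)  # axial strain seen by each core
    # Bending strain model: eps_i = r * (kx*cos(theta_i) + ky*sin(theta_i)),
    # where (kx, ky) are the curvature components; solve by least squares.
    A = CORE_RADIUS * np.column_stack([np.cos(angles), np.sin(angles)])
    kx, ky = np.linalg.lstsq(A, strain, rcond=None)[0]
    return np.hypot(kx, ky), np.arctan2(ky, kx)  # 1/m, rad
```

Repeating this fit at each of the four active areas and integrating the resulting curvatures along the needle axis would yield the 3D shape estimate; the temperature-compensation and integration steps are omitted here.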

    Multi-Contact Force-Sensing Guitar for Training and Therapy

    Hand injuries from repetitive high strain and physical overload can hamper or even end a musician's career. To help musicians develop safer playing habits, we developed a multiple-contact force-sensing array that can substitute for a guitar fretboard. The system consists of 72 individual force-sensing modules, each containing a flexure and a photointerrupter that measures the corresponding deflection when forces are applied. The system is capable of measuring forces between 0 and 25 N applied anywhere within the first 12 frets at a rate of 20 Hz, with an average accuracy of 0.4 N and a resolution of 0.1 N. Accompanied by a GUI, the resulting prototype was received positively by novice and expert musicians as a useful tool for learning and injury prevention.
    Comment: IEEE Sensor Conference, 201
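    Each module above maps a photointerrupter reading to a force through the flexure's stiffness. A minimal per-module conversion might look like the following sketch, where the stiffness, sensor sensitivity, and zero offset are assumed placeholder values (the paper's actual calibration is not given here):

```python
# Hypothetical calibration constants (illustrative, not the paper's values)
FLEXURE_STIFFNESS = 2500.0  # N/m, spring constant of the flexure (assumed)
COUNTS_PER_METER = 4.0e6    # photointerrupter sensitivity (assumed)
ADC_ZERO = 512              # ADC reading at zero deflection (assumed)

def force_from_counts(counts):
    """Convert one module's photointerrupter ADC reading to force in newtons.

    The flexure deflects under fingertip force; the photointerrupter reports
    that deflection as ADC counts, and Hooke's law recovers the force."""
    deflection = (counts - ADC_ZERO) / COUNTS_PER_METER  # meters
    force = FLEXURE_STIFFNESS * deflection
    return max(0.0, min(force, 25.0))  # clamp to the stated 0-25 N range
```

Sampling all 72 modules in sequence at 20 Hz would then produce the fretboard-wide force map described in the abstract.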

    Haptic-Enhanced Virtual Reality Simulator for Robot-Assisted Femur Fracture Surgery

    In this paper, we develop a virtual reality (VR) simulator for the Robossis robot-assisted femur fracture surgery. Given the steep learning curve for such procedures, a VR simulator is essential for training surgeons and staff. The Robossis Surgical Simulator (RSS) is designed to immerse users in a realistic surgical setting using the Robossis system, as completed in a previous real-world cadaveric procedure. The RSS interfaces the Sigma-7 Haptic Controller with the Robossis Surgical Robot (RSR) and the Meta Quest VR headset. Results show that the RSR follows user commands in 6 DOF and prevents the overlapping of bone segments. This development demonstrates a promising avenue for future implementation of the Robossis system.
    Comment: This paper is submitted to the IEEE Haptic Symposium 202

    Design and Experimental Evaluation of a Haptic Robot-Assisted System for Femur Fracture Surgery

    In the face of challenges encountered during femur fracture surgery, such as the high rates of malalignment and X-ray exposure to operating personnel, robot-assisted surgery has emerged as an alternative to conventional state-of-the-art surgical methods. This paper introduces the development of Robossis, a haptic system for robot-assisted femur fracture surgery. Robossis comprises a 7-DOF haptic controller and a 6-DOF surgical robot. A unilateral control architecture is developed to address the kinematic mismatch and the motion transfer between the haptic controller and the Robossis surgical robot. A real-time motion control pipeline handles this motion transfer and is evaluated through experimental testing. The analysis shows that the Robossis surgical robot can adhere to the desired trajectory from the haptic controller with an average translational error of 0.32 mm and a rotational error of 0.07 deg. Additionally, a haptic rendering pipeline is developed to resolve the kinematic mismatch by constraining the haptic controller (user hand) movement within the permissible joint limits of the Robossis surgical robot. Lastly, in a cadaveric lab test, the Robossis system assisted surgeons during a mock femur fracture surgery. The results show that Robossis can provide an intuitive solution for surgeons to perform femur fracture surgery.
    Comment: This paper is to be submitted to an IEEE journa
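    The haptic rendering idea described above, constraining the operator's commands to the surgical robot's permissible workspace, can be sketched as a simple per-cycle clamp. The 6-DOF limit values below are invented placeholders, and the real Robossis pipeline renders a restoring force at the limit rather than merely flagging it:

```python
import numpy as np

# Illustrative 6-DOF workspace limits: 3 translations (m), 3 rotations (rad).
# These values are assumed for the sketch, not the Robossis robot's limits.
Q_MIN = np.array([-0.05, -0.05, -0.05, -0.6, -0.6, -0.6])
Q_MAX = -Q_MIN  # symmetric limits

def constrain_command(q_desired):
    """Clamp a desired 6-DOF command from the haptic controller to the
    surgical robot's permissible limits.

    Returns the safe command and a flag indicating that a limit was hit,
    which a haptic rendering loop could use to push back on the user's hand."""
    q_safe = np.clip(q_desired, Q_MIN, Q_MAX)
    at_limit = not np.allclose(q_safe, q_desired)
    return q_safe, at_limit
```

Running this check every control cycle keeps the kinematically mismatched 7-DOF controller from commanding poses the 6-DOF robot cannot reach.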

    1.5 T augmented reality navigated interventional MRI: paravertebral sympathetic plexus injections

    PURPOSE: The high contrast resolution and absence of ionizing radiation of interventional magnetic resonance imaging (MRI) can be advantageous for paravertebral sympathetic nerve plexus injections. We assessed the feasibility and technical performance of MRI-guided paravertebral sympathetic injections utilizing an augmented reality navigation system and a 1.5 T MRI scanner.
    METHODS: A total of 23 bilateral injections of the thoracic (8/23, 35%), lumbar (8/23, 35%), and hypogastric (7/23, 30%) paravertebral sympathetic plexus were prospectively planned in twelve human cadavers using a 1.5 Tesla (T) MRI scanner and an augmented reality navigation system. MRI-conditional needles were used. Gadolinium-DTPA-enhanced saline was injected. Outcome variables included the number of control magnetic resonance images, target error of the needle tip, punctures of critical nontarget structures, distribution of the injected fluid, and procedure length.
    RESULTS: Augmented reality-navigated MRI guidance at 1.5 T provided detailed anatomical visualization for successful targeting of the paravertebral space, needle placement, and perineural paravertebral injections in 46 of 46 targets (100%). A mean of 2 images (range, 1–5 images) were required to control needle placement. Changes of the needle trajectory occurred in 9 of 46 targets (20%) and changes of needle advancement occurred in 6 of 46 targets (13%), which were statistically not related to spinal regions (P = 0.728 and P = 0.86, respectively) or cadaver sizes (P = 0.893 and P = 0.859, respectively). The mean error of the needle tip was 3.9 ± 1.7 mm. There were no punctures of critical nontarget structures. The mean procedure length was 33 ± 12 min.
    CONCLUSION: 1.5 T augmented reality-navigated interventional MRI can provide accurate imaging guidance for perineural injections of the thoracic, lumbar, and hypogastric sympathetic plexus.

    Robotic assistant for transperineal prostate interventions

    Numerous studies have demonstrated the efficacy of image-guided needle-based therapy and biopsy in the management of prostate cancer. The accuracy of traditional prostate interventions performed using transrectal ultrasound (TRUS) is limited by image fidelity, needle template guides, needle deflection, and tissue deformation. Magnetic resonance imaging (MRI) is an ideal modality for guiding and monitoring such interventions due to its excellent visualization of the prostate, its sub-structure, and surrounding tissues. We have designed a comprehensive robotic assistant system that allows prostate biopsy and brachytherapy procedures to be performed entirely inside a 3T closed MRI scanner. We present a detailed design of the robotic manipulator and an evaluation of its usability and MR compatibility.